Indirect Encoding Evolutionary Learning Algorithm for the Multilayer Morphological Perceptron
Authors
Abstract
This article describes an indirectly encoded evolutionary learning algorithm to train morphological neural networks. In the indirect encoding method, the network is trained by searching for a solution without considering the exact connectivity of the network. By searching for the set of weights and the architecture in a reduced search space, this simple but powerful training algorithm is able to evolve a feasible solution using up to three layers to perform pattern classification. This type of representation provides the compactness required by large networks. The algorithm was tested on Fisher's Iris data, and a prototype was written in Matlab.

Introduction

Morphological Neural Networks (MNN) are a new type of neural network described by Ritter, Sussner, and Beavers (Ritter and Sussner 1996), (Ritter and Sussner 1997), (Sussner 1998), and (Ritter and Beavers 1999). These networks replace the classical operations of multiplication and addition with addition and maximum (or minimum) operations. The maximum and minimum operations perform a nonlinear operation before the application of the activation (transfer) function. MNN use a lattice-algebraic structure known as the semiring (R±∞, ∨, ∧, +, +′), in contrast to traditional neural networks, which are based on the algebraic structure known as the ring (R, +, ×). The operations ∧ and ∨ denote the binary minimum and maximum operations, respectively. Genetic Algorithms (Yao 1999) have proven effective at searching for an optimal solution in very large, complex, and irregular search spaces such as neural network architectures. This article describes a method based on genetic algorithms that can be used to train the morphological neural networks introduced by Ritter, Sussner, and Beavers.
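To make the semiring-versus-ring distinction concrete, the following minimal sketch (not from the paper; NumPy is used here for illustration) contrasts a classical neuron's multiply-and-sum with the morphological add-and-maximum over the same inputs and weights:

```python
import numpy as np

# Same inputs and weights for both models (values chosen for illustration).
x = np.array([1.0, 2.0, 3.0])   # inputs x_i
w = np.array([0.5, -1.0, 0.2])  # weights w_i

# Classical neuron over the ring (R, +, x): a sum of products.
classical = np.dot(x, w)        # sum_i x_i * w_i = 0.5 - 2.0 + 0.6 = -0.9

# Morphological neuron over the lattice semiring: a maximum of sums.
morphological = np.max(x + w)   # max_i (x_i + w_i) = max(1.5, 1.0, 3.2) = 3.2
```

The maximum already introduces a nonlinearity, which is why it acts before any transfer function is applied.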
The algorithm, based on evolutionary computation, can train morphological perceptron architectures of up to three layers, which are able to solve most traditional pattern classification problems.

Copyright © 2004, American Association for Artificial Intelligence (www.aaai.org). All rights reserved.

Morphological Neural Networks

Morphological neural networks are a new type of neural network based on lattice operations. The morphological neuron follows the mathematical model described by Equation 1,

τ_j(x) = f( p_j · ⋁_{i=1}^{n} r_ij · (x_i + w_ij) )    (1)

where ∨ is the maximum operator (the minimum operator ∧ can be used instead), x_i is the i-th input value for the j-th neuron, w_ij denotes the synaptic weight between the i-th input and the j-th neuron, r_ij represents the inhibitory or excitatory pre-synaptic value between the i-th input and the j-th neuron, and p_j represents the postsynaptic response of the j-th neuron. Both r_ij and p_j take values in {+1, -1}. In addition, the morphological perceptron uses a special hard-limit transfer function, as shown in Equation 2:

f : R → {0, 1},   f(x) = 1 if x > 0, 0 otherwise    (2)

Figure 1 shows a graphical representation of a two-layer morphological neural network.
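Equations 1 and 2 can be sketched directly in code. This is a minimal illustration, not the paper's Matlab prototype; the function and variable names are chosen here for readability, with r and p restricted to {+1, -1} as in the text:

```python
import numpy as np

def morph_neuron(x, w, r, p):
    """Morphological neuron of Equation 1 followed by the hard-limit
    transfer function of Equation 2.

    x : input vector (x_1 .. x_n)
    w : synaptic weights w_ij for this neuron
    r : pre-synaptic values r_ij, each +1 (excitatory) or -1 (inhibitory)
    p : postsynaptic response p_j, +1 or -1
    """
    # Equation 1: p_j * max_i( r_ij * (x_i + w_ij) )
    activation = p * np.max(r * (x + w))
    # Equation 2: hard limit, f(x) = 1 if x > 0 else 0
    return 1 if activation > 0 else 0

# Example: r = (+1, +1) and p = +1 gives max(1+0, 2-3) = 1 > 0, so output 1;
# flipping p to -1 negates the activation and the neuron outputs 0.
y1 = morph_neuron(np.array([1.0, 2.0]), np.array([0.0, -3.0]),
                  np.array([1, 1]), 1)
y0 = morph_neuron(np.array([1.0, 2.0]), np.array([0.0, -3.0]),
                  np.array([1, 1]), -1)
```

A minimum-based neuron is obtained by replacing np.max with np.min, matching the remark that ∧ can be used in place of ∨.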
Similar Articles
Direct Encoding Evolutionary Learning Algorithm for Multilayer Morphological Perceptron
This paper presents a method based on evolutionary computation to train multilayer morphological perceptron (MLMP). The algorithm calculates network parameters such as its connection weights, pre-synaptic and postsynaptic values for a given network topology. Morphological perceptron are a new type of feed-forward artificial neural network based on lattice algebra which can be used for pattern c...
Full text

Stable On-Line Evolutionary Learning of NN-MLP
Stable On-Line Evolutionary Learning of NN-MLP. Qiangfu Zhao. Abstract — To design the nearest neighbor based multilayer perceptron (NN-MLP) efficiently, the author has proposed a non-genetic evolutionary algorithm called the R4-rule. For off-line learning, the R4-rule can produce the smallest or nearly smallest networks with high generalization ability by iteratively performing four basic...
Full text

Evolutionary learning of nearest-neighbor MLP
The nearest-neighbor multilayer perceptron (NN-MLP) is a single-hidden-layer network suitable for pattern recognition. To design an NN-MLP efficiently, this paper proposes a new evolutionary algorithm consisting of four basic operations: recognition, remembrance, reduction, and review. Experimental results show that this algorithm can produce the smallest or nearly smallest networks from random...
Full text

Application of ensemble learning techniques to model the atmospheric concentration of SO2
In view of pollution prediction modeling, the study adopts homogenous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of Sulphur dioxide. For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression and re...
Full text

Learning Algorithms for Small Mobile Robots: Case Study on Maze Exploration
An emergence of intelligent behavior within a simple robotic agent is studied in this paper. Two control mechanisms for an agent are considered — new direction of reinforcement learning called relational reinforcement learning, and a radial basis function neural network trained by evolutionary algorithm. Relational reinforcement learning is a new interdisciplinary approach combining logical pro...
Full text